Sobolev Orthogonal Polynomials in Two Variables and Second Order Partial Differential Equations

Authors

  • JEONGKEUN LEE
  • L. L. LITTLEJOHN
Abstract

We consider polynomials in two variables which satisfy an admissible second order partial differential equation of the form

(*)  $Au_{xx} + 2Bu_{xy} + Cu_{yy} + Du_x + Eu_y = \lambda u$,

and are orthogonal relative to a symmetric bilinear form defined by $\varphi(p,q) = \langle\sigma, pq\rangle + \langle\tau, p_x q_x\rangle$, where $A, \ldots, E$ are polynomials in $x$ and $y$, $\lambda$ is an eigenvalue parameter, and $\sigma$ and $\tau$ are linear functionals on polynomials. We find a condition for the partial differential equation (*) to have polynomial solutions which are orthogonal relative to the symmetric bilinear form $\varphi(\cdot,\cdot)$. Examples are also provided.

1991 Mathematics Subject Classification. 33C50, 35P99.
Key words and phrases. Orthogonal polynomials in two variables, bilinear symmetric form, Sobolev orthogonal polynomials in two variables, second order partial differential equations.

1. Introduction

In 1967, Krall and Sheffer [6] investigated a second order partial differential equation of the form

(1.1)  $A(x,y)u_{xx} + 2B(x,y)u_{xy} + C(x,y)u_{yy} + D(x)u_x + E(y)u_y = \lambda u$

and classified all weak orthogonal polynomials satisfying the partial differential equation (1.1), where $A(x,y), \ldots, E(y)$ are polynomials in $x$ and $y$, and $\lambda$ is an eigenvalue parameter. As a generalization, we consider polynomial solutions to the partial differential equation (1.1) which are orthogonal relative to a symmetric bilinear form $\varphi(\cdot,\cdot)$ on polynomials defined by

(1.2)  $\varphi(p,q) = \langle\sigma, pq\rangle + \langle\tau, p_x q_x\rangle$,

where $\sigma$ and $\tau$ are moment functionals and $p, q$ are polynomials in $x$ and $y$. The case $\tau = 0$ was investigated by Krall and Sheffer.

For the partial differential equation (1.1) considered by Krall and Sheffer, we know that (i) $C_x = 0$ (up to a linear change of independent variables) and (ii) the partial derivatives with respect to $x$ satisfy a partial differential equation of the same type as (1.1) (see [4]). These facts remind us of the Hahn-Sonine characterization theorem for classical orthogonal polynomials ([2, 5, 10]), which states that the only polynomial sequences $\{P_n(x)\}_{n=0}^{\infty}$ (up to a complex change of variable) which are simultaneously orthogonal with respect to bilinear forms of the form

$(p,q)_0 = \int_{\mathbb{R}} p(x)q(x)\, d\mu_0$,  $(p,q)_1 = \int_{\mathbb{R}} p(x)q(x)\, d\mu_0 + \int_{\mathbb{R}} p'(x)q'(x)\, d\mu_1$,

with $\mu_i$ ($i = 0, 1$) real-valued signed Borel measures, are the classical orthogonal polynomials of Jacobi, Laguerre, Hermite, and Bessel.

These facts naturally lead us to the problem of investigating polynomials orthogonal relative to $\varphi(\cdot,\cdot)$ in (1.2). But, contrary to the classical orthogonal polynomials in one variable, orthogonal polynomials in two variables whose partial derivatives with respect to $x$ or $y$ are orthogonal do not satisfy a partial differential equation of the form (1.1) (see [3] for this material). Instead, we consider polynomials in two variables which are orthogonal relative to the symmetric bilinear form $\varphi(\cdot,\cdot)$ in (1.2) and satisfy the partial differential equation (1.1).

In this paper, we give some basic facts on Sobolev orthogonal polynomials and the relationship between Sobolev orthogonal polynomials relative to the symmetric bilinear form $\varphi(\cdot,\cdot)$ in (1.2) and the partial differential equation (1.1). We also give some examples of partial differential equations having Sobolev orthogonal polynomials as solutions.

2. Preliminaries: Basic Theory of Orthogonal Polynomials in Two Variables

Let $\mathcal{P}_n$ be the space of all polynomials in $x$ and $y$ of degree $\le n$. The set of all polynomials in two variables is denoted by $\mathcal{P}$.
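The following one-variable computation, a standard illustration with the Hermite weight and not taken from this paper, shows the kind of simultaneous orthogonality the Hahn-Sonine theorem describes. Take $d\mu_0 = d\mu_1 = e^{-x^2}\,dx$ and use $H_n'(x) = 2nH_{n-1}(x)$:

\[
\int_{\mathbb{R}} H_m H_n\, e^{-x^2}\,dx = 2^{n} n!\,\sqrt{\pi}\,\delta_{mn},
\qquad
\int_{\mathbb{R}} H_m' H_n'\, e^{-x^2}\,dx = 4mn\int_{\mathbb{R}} H_{m-1} H_{n-1}\, e^{-x^2}\,dx = 2^{n+1} n\, n!\,\sqrt{\pi}\,\delta_{mn},
\]

so $(H_m, H_n)_1 = (2n+1)\,2^{n} n!\,\sqrt{\pi}\,\delta_{mn}$, and the Hermite polynomials are orthogonal with respect to both $(\cdot,\cdot)_0$ and $(\cdot,\cdot)_1$.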
By a polynomial system (in short, PS), we mean a sequence $\{\phi_{mn}(x,y)\}_{m,n=0}^{\infty}$ of polynomials such that $\deg \phi_{mn} = m+n$ for each $m, n \ge 0$ and $\{\phi_{n-j,j}\}_{j=0}^{n}$ is linearly independent modulo $\mathcal{P}_{n-1}$. We denote $\{\phi_{n-j,j}(x,y)\}_{j=0}^{n}$ by an $(n+1)$-dimensional column vector $\Phi_n$, and a PS $\{\phi_{mn}(x,y)\}_{m,n=0}^{\infty}$ by $\{\Phi_n\}_{n=0}^{\infty}$. We say that a PS $\{\Phi_n\}_{n=0}^{\infty}$ is monic if $\phi_{mn}(x,y) = x^m y^n$ modulo $\mathcal{P}_{m+n-1}$ for each $m, n \ge 0$. To a given PS $\{\Phi_n\}_{n=0}^{\infty}$, there corresponds a unique monic PS $\{P_n\}_{n=0}^{\infty}$ defined by $P_n = A_n^{-1}\Phi_n$, where $A_n = (a^n_{jk})_{j,k=0}^{n}$ and $\phi_{n-j,j}(x,y) = \sum_{k=0}^{n} a^n_{jk}\, x^{n-k} y^{k}$ modulo $\mathcal{P}_{n-1}$. It will be called the normalization of $\{\Phi_n\}_{n=0}^{\infty}$.

A linear functional $\sigma$ on $\mathcal{P}$ is called a moment functional. We denote the action of a moment functional $\sigma$ on a polynomial $\phi$ by $\langle\sigma, \phi\rangle$ instead of the customary $\sigma(\phi)$. Similarly, for a matrix $Q = (Q_{ij})$ with each $Q_{ij}$ a polynomial, $\langle\sigma, Q\rangle$ is defined to be the matrix $(\langle\sigma, Q_{ij}\rangle)$. We see that $\langle\sigma, AB^{T}\rangle = \langle\sigma, BA^{T}\rangle^{T}$ for any column vectors $A$ and $B$ of polynomials. For a moment functional $\sigma$, we define the partial derivatives of $\sigma$ by the formulas

(2.1)  $\langle\partial_x\sigma, \phi\rangle = -\langle\sigma, \partial_x\phi\rangle$, $\langle\partial_y\sigma, \phi\rangle = -\langle\sigma, \partial_y\phi\rangle$ for $\phi \in \mathcal{P}$,

and define the multiplication of $\sigma$ by a polynomial $\psi$ through the formula

(2.2)  $\langle\psi\sigma, \phi\rangle = \langle\sigma, \psi\phi\rangle$ for $\phi \in \mathcal{P}$.

Definition 2.1. A PS $\{\Phi_n\}_{n=0}^{\infty}$ is called an orthogonal basis (OB) relative to $\sigma$ if there is a nonzero moment functional $\sigma$ such that for all $n \ge 0$

$\langle\sigma, \phi\,\phi_{n-k,k}\rangle = 0$, $\phi \in \mathcal{P}_{n-1}$, $0 \le k \le n$.

And $\{\Phi_n\}_{n=0}^{\infty}$ is called a weak orthogonal polynomial set (WOPS) relative to $\sigma$ if there is a nonzero moment functional $\sigma$ such that $\langle\sigma, \phi_{m,n}\phi_{k,l}\rangle = 0$ if $m+n \ne k+l$. If, in addition, $\langle\sigma, \phi_{m,n}\phi_{k,l}\rangle = K_{mn}\,\delta_{mk}\delta_{nl}$ with $K_{mn} \ne 0$ (respectively, $K_{mn} > 0$) for each $m, n \ge 0$, we say that $\{\Phi_n\}_{n=0}^{\infty}$ is an orthogonal polynomial set (in short, OPS) (respectively, a positive-definite OPS) relative to $\sigma$. It is obvious that there is an OB relative to $\sigma$ if and only if there is a WOPS relative to $\sigma$.

Definition 2.2. A moment functional $\sigma$ is quasi-definite (respectively, weakly quasi-definite) if there is an OPS (respectively, a WOPS) relative to $\sigma$.

From Definitions 2.1 and 2.2, we see that a PS $\{\Phi_n\}_{n=0}^{\infty}$ is an OPS (respectively, a positive-definite OPS) relative to $\sigma$ if and only if $\langle\sigma, \Phi_m\Phi_n^{T}\rangle = H_n\,\delta_{mn}$ and $H_n := \langle\sigma, \Phi_n\Phi_n^{T}\rangle$ is a nonsingular (respectively, a positive-definite) diagonal matrix. For any PS $\{\Phi_n\}_{n=0}^{\infty}$, there is a unique moment functional $\sigma$, called the canonical moment functional of $\{\Phi_n\}_{n=0}^{\infty}$, defined by the conditions

$\langle\sigma, \phi_{00}\rangle = 1$; $\langle\sigma, \phi_{mn}\rangle = 0$, $m+n \ge 1$.

Note that if a PS $\{\Phi_n\}_{n=0}^{\infty}$ is an OB relative to $\sigma$, then $\sigma$ is a constant multiple of the canonical moment functional of $\{\Phi_n\}_{n=0}^{\infty}$. Although $\{\Phi_n\}_{n=0}^{\infty}$ may be an OPS relative to $\sigma$, its normalization $\{P_n\}_{n=0}^{\infty}$ need not be an OPS relative to $\sigma$; however, $\{P_n\}_{n=0}^{\infty}$ is an OB relative to $\sigma$. It is not easy to produce an OPS relative to $\sigma$, if any, from an OB relative to $\sigma$.

Theorem 2.1 ([4, 6]). For any moment functional $\sigma$, the following statements are equivalent.
(i) $\sigma$ is quasi-definite.
(ii) There is a unique monic OB $\{P_n\}_{n=0}^{\infty}$ relative to $\sigma$.
(iii) There is a monic OB $\{P_n\}_{n=0}^{\infty}$ such that $H_n := \langle\sigma, P_n P_n^{T}\rangle$ is nonsingular for all $n \ge 0$.

Theorem 2.2 (Favard's Theorem) ([11]). Let $\{\Phi_n\}_{n=0}^{\infty}$ be a PS. Then the following statements are equivalent.
(i) $\{\Phi_n\}_{n=0}^{\infty}$ is a WOPS relative to a quasi-definite moment functional $\sigma$.
(ii) For $n \ge 0$ and $i = 1, 2$, there are matrices $A_{ni}$ of order $(n+1)\times(n+2)$, $B_{ni}$ of order $(n+1)\times(n+1)$, and $C_{ni}$ of order $(n+1)\times n$ such that
(a) $x_i\Phi_n = A_{ni}\Phi_{n+1} + B_{ni}\Phi_n + C_{ni}\Phi_{n-1}$ (here $x_1 = x$, $x_2 = y$);
(b) $\operatorname{rank} C_n = n+1$, where $C_n = (C_{n1}, C_{n2})$.

Lemma 2.3. Let $\sigma$ be a moment functional and $\psi$ a polynomial. Then we have
(i) $\sigma = 0$ if and only if $\sigma_x = 0$ or $\sigma_y = 0$;
(ii) $(\psi\sigma)_x = \psi_x\sigma + \psi\sigma_x$ and $(\psi\sigma)_y = \psi_y\sigma + \psi\sigma_y$.

Proof.
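As a concrete illustration of Definitions 2.1 and 2.2 (a standard example, not drawn from this excerpt), take the product Hermite PS $\phi_{mn}(x,y) = H_m(x)H_n(y)$ and the moment functional $\sigma$ generated by the Hermite weight on $\mathbb{R}^2$:

\[
\langle\sigma, p\rangle = \int_{\mathbb{R}^2} p(x,y)\,e^{-x^2-y^2}\,dx\,dy,
\qquad
\langle\sigma, \phi_{mn}\phi_{kl}\rangle = 2^{m+n}\, m!\, n!\,\pi\,\delta_{mk}\delta_{nl}.
\]

Here $K_{mn} = 2^{m+n} m!\, n!\,\pi > 0$ for all $m, n \ge 0$, so $\{\Phi_n\}_{n=0}^{\infty}$ is a positive-definite OPS relative to $\sigma$; in particular, $\sigma$ is quasi-definite.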
(i) The proof is obvious. (ii) A computation shows that, for any $p \in \mathcal{P}$, we have

$\langle(\psi\sigma)_x, p\rangle = -\langle\psi\sigma, p_x\rangle = -\langle\sigma, \psi p_x\rangle = -\langle\sigma, (\psi p)_x - \psi_x p\rangle = \langle\sigma_x, \psi p\rangle + \langle\sigma, \psi_x p\rangle = \langle\psi\sigma_x, p\rangle + \langle\psi_x\sigma, p\rangle = \langle\psi\sigma_x + \psi_x\sigma, p\rangle$,

which means $(\psi\sigma)_x = \psi_x\sigma + \psi\sigma_x$. By a similar calculation, we have $(\psi\sigma)_y = \psi_y\sigma + \psi\sigma_y$.

If the partial differential equation (1.1) has a PS $\{\Phi_n\}_{n=0}^{\infty}$ as solutions, then it must be of the form

(2.3)  $Au_{xx} + 2Bu_{xy} + Cu_{yy} + Du_x + Eu_y = (ax^2 + d_1x + e_1y + f_1)u_{xx} + (2axy + d_2x + e_2y + f_2)u_{xy} + (ay^2 + d_3x + e_3y + f_3)u_{yy} + (gx + h_1)u_x + (gy + h_2)u_y = \lambda_n u$,

where $\lambda_n = an(n-1) + gn$. We say that the partial differential equation (2.3) is admissible if $\lambda_m \ne \lambda_n$ for $m \ne n$. Equation (2.3) has a unique monic PS as solutions if and only if it is admissible.

Theorem 2.4 ([4]). Let $\sigma$ be the canonical moment functional of a PS $\{\Phi_n\}_{n=0}^{\infty}$. If $\{\Phi_n\}_{n=0}^{\infty}$ satisfies the partial differential equation (2.3), then $\sigma$ satisfies the equation

(2.4)  $L^{*}[\sigma] = (A\sigma)_{xx} + 2(B\sigma)_{xy} + (C\sigma)_{yy} - (D\sigma)_x - (E\sigma)_y = 0$,

where $L^{*}[u] := (Au)_{xx} + 2(Bu)_{xy} + (Cu)_{yy} - (Du)_x - (Eu)_y$ is the formal Lagrange adjoint of $L[\cdot]$. Furthermore, (2.4) has a unique solution up to a multiplicative constant if the partial differential equation (2.3) is admissible.

Theorem 2.5 ([4, 6]). Let $\{\Phi_n\}_{n=0}^{\infty}$ be an OB relative to $\sigma$. Then the following statements are equivalent.
(i) $\{\Phi_n\}_{n=0}^{\infty}$ satisfies the partial differential equation (2.3).
(ii) $\sigma$ satisfies the moment equations

(2.5)  $M_1[\sigma] := (A\sigma)_x + (B\sigma)_y - D\sigma = 0$, $M_2[\sigma] := (B\sigma)_x + (C\sigma)_y - E\sigma = 0$.

Remark 2.1. For any moment functional $\sigma$, $L^{*}[\sigma]$ can be written in the form $L^{*}[\sigma] = (M_1[\sigma])_x + (M_2[\sigma])_y$. This formula will be used in Section 4.

Theorem 2.6 ([4]). Let $\{\Phi_n\}_{n=0}^{\infty}$ be a PS satisfying the admissible partial differential equation (2.3) and $\sigma$ the canonical moment functional of $\{\Phi_n\}_{n=0}^{\infty}$. Then the following statements are equivalent.
(i) $\{\Phi_n\}_{n=0}^{\infty}$ is a WOPS relative to $\sigma$.
(ii) $M_1[\sigma] = 0$.
(iii) $M_2[\sigma] = 0$.

3. Theory of Sobolev Orthogonal Polynomials in Two Variables

We know that any moment functional $\sigma$ defines a symmetric bilinear form $\varphi(\cdot,\cdot)$ on $\mathcal{P}\times\mathcal{P}$ through the formula $\varphi(p,q) = \langle\sigma, pq\rangle$. Conversely, a symmetric bilinear form can be generated by a moment functional provided some conditions are fulfilled.

Theorem 3.1. Let $\varphi(\cdot,\cdot)$ be a symmetric bilinear form on $\mathcal{P}\times\mathcal{P}$. Then the following statements are equivalent.
(i) There is a moment functional $\sigma$ such that $\varphi(p,q) = \langle\sigma, pq\rangle$ for any $p, q \in \mathcal{P}$.
(ii) $\varphi(xp, q) = \varphi(p, xq)$ and $\varphi(yp, q) = \varphi(p, yq)$ for any $p, q \in \mathcal{P}$.

Proof. ($\Rightarrow$) It is obvious. ($\Leftarrow$) Define a moment functional $\sigma$ by $\langle\sigma, p\rangle = \varphi(p, 1)$, $p \in \mathcal{P}$. Then, writing $q = \sum_{i+j=0}^{\deg q} a_{ij}x^i y^j$, we have for any $p, q \in \mathcal{P}$

$\varphi(p, q) = \varphi\bigl(p, \sum_{i+j=0}^{\deg q} a_{ij}x^i y^j\bigr) = \sum_{i+j=0}^{\deg q} a_{ij}\,\varphi(p, x^i y^j) = \sum_{i+j=0}^{\deg q} \ldots$
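To make the form (2.3), the admissibility condition, and the moment equations (2.5) concrete, consider the Hermite equation on $\mathbb{R}^2$, a standard example from the Krall-Sheffer class included here only as an illustration (it is not one of the examples quoted from this paper):

\[
u_{xx} + u_{yy} - 2x\,u_x - 2y\,u_y = \lambda_n u,
\qquad a = 0,\ g = -2,\ \lambda_n = -2n,
\]

which is admissible since $\lambda_m \ne \lambda_n$ whenever $m \ne n$. The moment functional $\langle\sigma, p\rangle = \int_{\mathbb{R}^2} p\, e^{-x^2-y^2}\,dx\,dy$ satisfies the moment equations (2.5): here $M_1[\sigma] = \sigma_x + 2x\sigma$ and $M_2[\sigma] = \sigma_y + 2y\sigma$, and both vanish because the weight obeys $W_x = -2xW$ and $W_y = -2yW$. Consistently with Theorem 2.5, the product Hermite polynomials $H_m(x)H_n(y)$ satisfy the equation with eigenvalue $\lambda_{m+n} = -2(m+n)$.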


Related articles

Matrix Pearson equations satisfied by Koornwinder weights in two variables

We consider Koornwinder’s method for constructing orthogonal polynomials in two variables from orthogonal polynomials in one variable. If semiclassical orthogonal polynomials in one variable are used, then Koornwinder’s construction generates semiclassical orthogonal polynomials in two variables. We consider two methods for deducing matrix Pearson equations for weight functions associated with ...


Orthogonal Polynomials Satisfying Partial Differential Equations Belonging to the Basic Class

We classify all partial differential equations with polynomial coefficients in $x$ and $y$ of the form $A(x)u_{xx} + 2B(x,y)u_{xy} + C(y)u_{yy} + D(x)u_x + E(y)u_y = \lambda_n u$ which have weak orthogonal polynomials as solutions, and show that the partial derivatives of all orders are also orthogonal. Also, we construct orthogonal polynomials in $d$ variables satisfying second order partial differential equations in $d$ variables.


A fractional type of the Chebyshev polynomials for approximation of solution of linear fractional differential equations

In this paper we introduce a type of fractional-order polynomials based on the classical Chebyshev polynomials of the second kind (FCSs). We also construct the operational matrix of the fractional derivative of order $\gamma$ in the Caputo sense for the FCSs and show that this matrix, together with the Tau method, can be utilized to reduce the solution of some fractional-order differential equations.


Second Order Difference Equations and Discrete Orthogonal Polynomials of Two Variables

The second order partial difference equation in two variables $Du := A_{1,1}(x)\Delta_1\nabla_1 u + A_{1,2}(x)\Delta_1\nabla_2 u + A_{2,1}(x)\Delta_2\nabla_1 u + A_{2,2}(x)\Delta_2\nabla_2 u + B_1(x)\Delta_1 u + B_2(x)\Delta_2 u = \lambda u$ is studied to determine when it has orthogonal polynomials as solutions. We derive conditions on $D$ so that a weight function $W$ exists for which $WDu$ is self-adjoint and the difference equation has polynomial solutions which are orthogonal with respect to ...


Recurrences and explicit formulae for the expansion and connection coefficients in series of the product of two classical discrete orthogonal polynomials

Suppose that for an arbitrary function $f(x,y)$ of two discrete variables, we have the formal expansions $$f(x,y)=\sum_{m,n=0}^{\infty} a_{m,n}\,P_{m}(x)P_{n}(y), \qquad x^{m}P_{j}(x)=\sum_{n=0}^{2m} a_{m,n}(j)\,P_{j+m-n}(x).$$ We find the coefficients $b_{i,j}^{(p,q,\ell,r)}$ in the expansion $$x^{\ell}y^{r}\,\nabla_{x}^{p}\nabla_{y}^{q}\,f(x,y)=x^{\ell}y^{r}f^{(p,q)}(x,y)=\sum\ldots$$


Sobolev Orthogonal Polynomials on a Simplex

The Jacobi polynomials on the simplex are orthogonal polynomials with respect to the weight function $W_{\gamma}(x) = x_1^{\gamma_1}\cdots x_d^{\gamma_d}(1-|x|)^{\gamma_{d+1}}$ when all $\gamma_i > -1$, and they are eigenfunctions of a second order partial differential operator $L_{\gamma}$. The singular cases in which some, or all, of $\gamma_1, \ldots, \gamma_{d+1}$ are $-1$ are studied in this paper. Firstly, a complete basis of polynomials that are eigenfunctions of $L_{\gamma}$ ...



Journal:

Volume:   Issue:

Pages:  -

Publication year: 2007